
    Spontaneous and stimulus-induced coherent states of critically balanced neuronal networks

    How the information microscopically processed by individual neurons is integrated and used in organizing the behavior of an animal is a central question in neuroscience. The coherence of neuronal dynamics over different scales has been suggested as a clue to the mechanisms underlying this integration. Balanced excitation and inhibition may amplify microscopic fluctuations to a macroscopic level, thus providing a mechanism for generating coherent multiscale dynamics. Previous theories of brain dynamics, however, were restricted to cases in which inhibition dominated excitation and suppressed fluctuations in the macroscopic population activity. In the present study, we investigate the dynamics of neuronal networks at a critical point between excitation-dominant and inhibition-dominant states. In these networks, the microscopic fluctuations are amplified by the strong excitation and inhibition to drive the macroscopic dynamics, while the macroscopic dynamics determine the statistics of the microscopic fluctuations. Developing a novel type of mean-field theory applicable to this class of interscale interactions, we show that the amplification mechanism generates spontaneous, irregular macroscopic rhythms similar to those observed in the brain. Through the same mechanism, microscopic inputs to a small number of neurons effectively entrain the dynamics of the whole network. These network dynamics undergo a probabilistic transition to a coherent state as the magnitude of either the balanced excitation and inhibition or the external inputs is increased. Our mean-field theory successfully predicts the behavior of this model. Furthermore, we numerically demonstrate that the coherent dynamics can be used for state-dependent read-out of information from the network. These results show a novel form of neuronal information processing that connects neuronal dynamics on different scales.
    Comment: 20 pages, 12 figures (main text) + 23 pages, 6 figures (appendix); some of the results were removed in this revision to reduce length; see the previous version for more results.
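    As a toy illustration of the balance mechanism described above, the sketch below (my own minimal construction, not the paper's model) simulates a random network whose couplings scale as 1/sqrt(N), so that excitation and inhibition cancel on average while their fluctuations survive in the population activity. All parameter values are illustrative.

    ```python
    # Minimal balanced random-network sketch (illustrative, not the paper's model).
    import numpy as np

    rng = np.random.default_rng(0)
    N = 1000           # number of neurons
    g = 5.0            # coupling strength; large g = strong excitation/inhibition
    dt, T = 0.1, 2000  # Euler step size and number of steps

    # Zero-mean couplings with variance g^2 / N: excitatory and inhibitory
    # inputs cancel on average ("balance"), leaving amplified fluctuations.
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

    x = rng.normal(0.0, 1.0, N)  # membrane-like state variables
    pop = np.empty(T)            # macroscopic population activity

    for t in range(T):
        r = np.tanh(x)           # firing-rate nonlinearity
        x += dt * (-x + J @ r)   # leaky dynamics driven by recurrent input
        pop[t] = r.mean()        # macroscopic observable

    # For weak coupling the population average stays nearly constant; strong,
    # balanced coupling amplifies microscopic fluctuations into irregular
    # macroscopic activity, qualitatively like the coherent states above.
    print("std of population activity:", pop.std())
    ```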

    An Inherent Trade-Off in Noisy Neural Communication with Rank-Order Coding

    Rank-order coding, a form of temporal coding, has emerged as a promising scheme to explain the rapid processing ability of the mammalian brain. Owing to its speed and efficiency, rank-order coding is increasingly gaining interest in diverse research areas beyond neuroscience. However, much uncertainty still exists about the performance of rank-order coding under noise. Herein we show what information rates are fundamentally possible and what trade-offs are at stake. An unexpected finding in this paper is the emergence of a special class of errors that, in a certain regime, become more frequent as noise decreases.
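    To make the noise sensitivity of rank-order coding concrete, here is a small sketch (an illustrative construction, not the paper's analysis) in which a stimulus is encoded purely in the order in which channels fire, and Gaussian latency jitter scrambles that order.

    ```python
    # Rank-order coding under latency noise (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(1)
    n_channels, n_trials, sigma = 8, 10000, 0.05

    errors = 0
    for _ in range(n_trials):
        intensity = rng.random(n_channels)  # stimulus drive per channel
        latency = 1.0 - intensity           # stronger drive -> earlier spike
        true_rank = np.argsort(latency)     # the codeword is the firing order
        noisy = latency + rng.normal(0.0, sigma, n_channels)
        if not np.array_equal(np.argsort(noisy), true_rank):
            errors += 1  # any swap of adjacent latencies corrupts the codeword

    print(f"codeword error rate at sigma={sigma}: {errors / n_trials:.3f}")
    ```

    Increasing n_channels raises the information per codeword (log2 of n_channels factorial) but packs latencies closer together, so the error rate climbs, a simple version of the kind of trade-off at stake.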

    Somatodendritic consistency check for temporal feature segmentation

    The brain identifies potentially salient features within continuous information streams to process hierarchical temporal events. This requires the compression of information streams, for which effective computational principles have yet to be explored. Backpropagating action potentials can induce synaptic plasticity in the dendrites of cortical pyramidal neurons. By analogy with this effect, we model a self-supervising process that increases the similarity between dendritic and somatic activities, where the somatic activity is normalized by a running average. We further show that a family of networks composed of these two-compartment neurons performs a surprisingly wide variety of complex unsupervised learning tasks, including chunking of temporal sequences and the source separation of mixed correlated signals. No common method applicable to all of these temporal feature analyses was previously known. Our results suggest that neural networks with dendrites have a powerful ability to analyze temporal features. This simple neuron model may also prove useful in neural engineering applications.
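    The learning rule itself is compact. Below is a minimal sketch of one plausible reading of it (assumptions mine, not the authors' published code): dendritic synapses are trained so that the dendritic prediction matches the somatic activity, with the soma normalized by a running average of its own magnitude.

    ```python
    # Two-compartment self-supervising rule: a hedged, illustrative sketch.
    import numpy as np

    rng = np.random.default_rng(2)
    n_in, eta, tau = 20, 0.01, 0.99  # inputs, learning rate, running-average decay

    w = rng.normal(0.0, 0.1, n_in)   # dendritic synaptic weights
    running_avg = 1.0                # running average of somatic magnitude

    for step in range(5000):
        x = rng.random(n_in)                                  # presynaptic pattern
        soma = np.tanh(x.sum() / n_in + rng.normal(0, 0.1))   # toy somatic drive
        running_avg = tau * running_avg + (1 - tau) * abs(soma)
        target = soma / running_avg                           # normalized soma
        dend = w @ x                                          # dendritic prediction
        w += eta * (target - dend) * x                        # shrink the mismatch

    print("final soma-dendrite mismatch:", abs(target - dend))
    ```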

    Stability versus Neuronal Specialization for STDP: Long-Tail Weight Distributions Solve the Dilemma

    Spike-timing-dependent plasticity (STDP) modifies the weight (or strength) of synaptic connections between neurons and is considered to be crucial for generating network structure. It has been observed in physiology that, in addition to spike timing, the weight update also depends on the current value of the weight. The functional implications of this feature are still largely unclear. Additive STDP gives rise to strong competition among synapses, but due to the absence of weight dependence, it requires hard boundaries to secure the stability of weight dynamics. Multiplicative STDP with linear weight dependence for depression ensures stability, but it lacks the strong competition required to obtain clear synaptic specialization. A solution to this stability-versus-function dilemma can be found with an intermediate parametrization between additive and multiplicative STDP. Here we propose a novel solution to the dilemma, named log-STDP, whose key feature is a sublinear weight dependence for depression. Due to its specific weight dependence, this new model can produce markedly broad weight distributions with no hard upper bound, similar to those recently observed in experiments. Log-STDP induces graded competition between synapses, such that synapses receiving stronger input correlations are pushed further into the tail of (very) large weights. Strong weights are functionally important to enhance the neuronal response to synchronous spike volleys. Depending on the input configuration, multiple groups of correlated synaptic inputs exhibit either winner-share-all or winner-take-all behavior. When the configuration of input correlations changes, individual synapses quickly and robustly readapt to represent the new configuration. We also demonstrate the advantages of log-STDP for generating a stable structure of strong weights in a recurrently connected network. These properties of log-STDP are compared with those of previous models. Through long-tail weight distributions, log-STDP achieves both stable dynamics and robust competition of synapses, which are crucial for spike-based information processing.
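    The key ingredient named above, sublinear weight dependence for depression, is easy to state in code. The functional forms and parameters below are illustrative choices in the spirit of log-STDP, not necessarily the paper's exact equations.

    ```python
    # Log-STDP-style weight dependence: a hedged, illustrative sketch.
    import numpy as np

    rng = np.random.default_rng(3)
    w0, alpha, c_p, c_d, eta = 1.0, 5.0, 1.0, 0.5, 0.01

    def depression(w):
        # Linear below the reference weight w0, logarithmic (sublinear) above it,
        # so large weights are depressed far less than a multiplicative rule would.
        return np.where(w <= w0, w / w0,
                        1.0 + np.log1p(alpha * (w / w0 - 1.0)) / alpha)

    def potentiation(w):
        # Slowly decaying with w, so potentiation never vanishes entirely.
        return np.exp(-w / (5.0 * w0))

    # Drive an ensemble of synapses with random spike-pair orderings and watch
    # the stationary weight distribution develop a long tail.
    w = np.full(2000, w0)
    for _ in range(20000):
        pre_before_post = rng.random(w.size) < 0.5   # random pair ordering
        dw = np.where(pre_before_post,
                      eta * c_p * potentiation(w),   # LTP for pre->post pairs
                      -eta * c_d * depression(w))    # LTD for post->pre pairs
        w = np.maximum(w + dw, 0.0)

    # Sublinear depression lets rare strong weights persist (long tail) while
    # the bulk stays stable: the stability-versus-competition compromise.
    print("mean, std, max of weights:", w.mean(), w.std(), w.max())
    ```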

    Competition on presynaptic resources enhances the discrimination of interfering memories

    Evidence suggests that hippocampal adult neurogenesis is critical for discriminating strongly interfering memories. During adult neurogenesis, synaptic competition modifies the weights of synaptic connections nonlocally across neurons, thus providing a form of unsupervised learning different from Hebb’s local plasticity rule. However, how synaptic competition separates similar memories remains largely unknown. Here, we aim to link synaptic competition with such pattern separation. In synaptic competition, adult-born neurons are integrated into the existing neuronal pool by competing with mature neurons for synaptic connections from the entorhinal cortex. We show that synaptic competition and neuronal maturation play distinct roles in separating interfering memory patterns. Furthermore, we demonstrate that a feedforward neural network trained by a competition-based learning rule can outperform a multilayer perceptron trained by the backpropagation algorithm when only a small number of samples are available. Our results unveil the functional implications and potential applications of synaptic competition in neural computation.
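    One simple way to picture competition on presynaptic resources (my illustrative construction, not the paper's model) is to give each presynaptic axon a fixed weight budget shared across postsynaptic neurons, so Hebbian growth at one neuron necessarily weakens its competitors' connections, a nonlocal effect unlike Hebb's local rule.

    ```python
    # Competition for a fixed presynaptic weight budget: illustrative sketch.
    import numpy as np

    rng = np.random.default_rng(4)
    n_pre, n_post, eta, budget = 50, 10, 0.05, 1.0

    W = rng.random((n_post, n_pre))
    W *= budget / W.sum(axis=0)  # each presynaptic axon's column sums to `budget`

    for _ in range(3000):
        x = (rng.random(n_pre) < 0.2).astype(float)  # sparse input pattern
        y = W @ x                                    # postsynaptic activations
        winner = y.argmax()                          # most active neuron
        W[winner] += eta * x                         # Hebbian growth at the winner
        W *= budget / W.sum(axis=0)                  # re-divide the shared resource

    # Columns still sum to `budget`, but rows have specialized: weight gained by
    # the winner was taken nonlocally from the other neurons on the same axon.
    print("per-axon totals (each should be ~1):", W.sum(axis=0)[:5])
    ```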

    Extended Temporal Association Memory by Modulations of Inhibitory Circuits

    Hebbian learning of excitatory synapses plays a central role in storing activity patterns in associative memory models. Interstimulus Hebbian learning associates multiple items by converting temporal correlation to spatial correlation between attractors. Growing evidence suggests the importance of inhibitory plasticity in memory processing, but the consequences of such regulation for associative memory have not been understood. Noting that Hebbian learning of inhibitory synapses yields an anti-Hebbian effect, we show that the combination of Hebbian and anti-Hebbian learning can significantly increase the span of temporal association between correlated attractors as well as the sensitivity of these states to external input. Furthermore, these effects are regulated by changing the ratio of local to global recurrent inhibition after learning the weights for excitation-inhibition balance. Our results suggest a nontrivial role for the plasticity and modulation of inhibitory circuits in associative memory.
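    Interstimulus Hebbian learning has a classic Hopfield-style formulation that the sketch below follows (a construction in the spirit of the abstract, not the authors' exact model): coupling terms between temporally adjacent patterns turn temporal correlation into spatial correlation between attractors.

    ```python
    # Temporal association between attractors via interstimulus Hebbian terms.
    import numpy as np

    rng = np.random.default_rng(5)
    N, P, a = 500, 10, 0.4  # neurons, patterns, interstimulus coupling strength

    xi = rng.choice([-1, 1], size=(P, N)).astype(float)

    # Standard Hebbian autoassociation plus symmetric cross-terms that associate
    # each pattern mu with its temporal neighbors mu-1 and mu+1.
    J = xi.T @ xi / N
    for mu in range(P - 1):
        J += a * (np.outer(xi[mu], xi[mu + 1]) +
                  np.outer(xi[mu + 1], xi[mu])) / N
    np.fill_diagonal(J, 0.0)

    # Retrieve from a noisy version of pattern 4 and measure pattern overlaps.
    s = np.sign(xi[4] + 0.8 * rng.normal(size=N))
    for _ in range(20):
        s = np.sign(J @ s)
    overlaps = xi @ s / N

    # With a > 0 the retrieved state overlaps not only pattern 4 but also its
    # temporal neighbors; shifting inhibition (an anti-Hebbian effect) would
    # modulate that span, which is the regulation studied in the paper.
    print("overlap with each pattern:", np.round(overlaps, 2))
    ```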